Spectral feature projections that maximize Shannon mutual information with class labels

Authors

  • Umut Ozertem
  • Deniz Erdogmus
  • Robert Jenssen
Abstract

Determining optimal subspace projections that can maintain task-relevant information in the data is an important problem in machine learning and pattern recognition. In this paper, we propose a nonparametric nonlinear subspace projection technique that maximally maintains class separability under the Shannon mutual information (MI) criterion. Employing kernel density estimates for nonparametric estimation of MI makes possible an interesting marriage of kernel density estimation-based information-theoretic methods and kernel machines, which have the ability to determine nonparametric nonlinear solutions for difficult problems in machine learning. Significant computational savings are achieved by translating the definition of the desired projection into the kernel-induced feature space, which yields an analytical solution. © 2006 Pattern Recognition Society. Published by Elsevier Ltd. All rights reserved.
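As a rough illustration of the KDE-based MI criterion described above (not the paper's feature-space algorithm), the Shannon MI between a one-dimensional projection and discrete class labels can be estimated by resubstitution with Gaussian kernel density estimates, using the decomposition I(Y; C) = H(Y) − Σ_c P(c) H(Y | c). A minimal numpy sketch, where `mi_projection`, `kde_log_density`, and the Silverman bandwidth rule are illustrative choices:

```python
import numpy as np

def kde_log_density(x, data, bw):
    # Gaussian KDE with bandwidth bw, evaluated at points x from 1-D samples `data`.
    diffs = (x[:, None] - data[None, :]) / bw
    kernels = np.exp(-0.5 * diffs ** 2) / (np.sqrt(2.0 * np.pi) * bw)
    return np.log(kernels.mean(axis=1) + 1e-300)

def mi_projection(y, labels):
    """Resubstitution estimate of I(Y; C) = H(Y) - sum_c P(c) H(Y | c)
    for a 1-D projection y and discrete class labels."""
    n = len(y)
    bw = 1.06 * y.std() * n ** (-0.2)  # Silverman's rule of thumb (illustrative)
    h_y = -kde_log_density(y, y, bw).mean()
    h_y_given_c = 0.0
    for c in np.unique(labels):
        yc = y[labels == c]
        h_y_given_c += (len(yc) / n) * (-kde_log_density(yc, yc, bw).mean())
    return h_y - h_y_given_c

# Two well-separated classes: the projection carries nearly all class information.
rng = np.random.default_rng(0)
y = np.concatenate([rng.normal(-2.0, 1.0, 500), rng.normal(2.0, 1.0, 500)])
c = np.repeat([0, 1], 500)
print(mi_projection(y, c))  # approaches the upper bound ln 2 ~ 0.693 as classes separate
```

A projection that mixes the classes (e.g. with shuffled labels) drives this estimate toward zero, which is what makes MI a usable class-separability objective.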


Similar articles

A Comparison of Linear ICA and Local Linear ICA for Mutual Information Based Feature Ranking

Feature selection and dimensionality reduction are important for high-dimensional signal processing and pattern recognition problems. Feature selection can be achieved by a filter approach, in which certain criteria must be optimized. Using the mutual information (MI) between feature vectors and class labels as the criterion, we proposed an ICA-MI framework for feature selection. In this paper, we ...


Mutual Information Feature Extractors for Neural Classifiers

This paper presents and evaluates two linear feature extractors based on mutual information. These feature extractors consider general dependencies between features and class labels, as opposed to statistical techniques such as PCA, which does not consider class labels, and LDA, which uses only simple first-order dependencies. As evidenced by several simulations on high-dimensional data sets, the p...


Mutual Information in Learning Feature Transformations

We present feature transformations useful for exploratory data analysis or for pattern recognition. Transformations are learned from example data sets by maximizing the mutual information between transformed data and their class labels. We make use of Renyi’s quadratic entropy, and we extend the work of Principe et al. to mutual information between continuous multidimensional variables and disc...
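The Renyi quadratic entropy mentioned in this blurb is attractive precisely because, under a Gaussian KDE, it admits a closed-form sample estimator (the information potential of Principe et al.), with no numerical integration. A brief numpy sketch, where `renyi_quadratic_entropy` and the kernel width are illustrative choices:

```python
import numpy as np

def renyi_quadratic_entropy(y, sigma):
    """H2(Y) = -log V, where the information potential
    V = (1/N^2) * sum_ij G(y_i - y_j; sqrt(2)*sigma)
    is the exact integral of the squared Gaussian KDE of the 1-D samples y."""
    s = np.sqrt(2.0) * sigma  # convolution of two Gaussian kernels of width sigma
    d = y[:, None] - y[None, :]
    g = np.exp(-0.5 * (d / s) ** 2) / (np.sqrt(2.0 * np.pi) * s)
    return -np.log(g.mean())

rng = np.random.default_rng(0)
y = rng.normal(0.0, 1.0, 1000)
# Smoothed sample estimate; the true H2 for N(0,1) is log(2*sqrt(pi)) ~ 1.266.
print(renyi_quadratic_entropy(y, 0.3))
```

The pairwise-kernel sum is the reason quadratic (rather than Shannon) entropy is the workhorse of these information-theoretic learning methods: the estimator is differentiable in the samples, so it can be plugged directly into gradient-based transformation learning.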


Linear feature extractors based on mutual information

This paper presents and evaluates two linear feature extractors based on mutual information. These feature extractors consider general dependencies between features and class labels, as opposed to well-known linear methods such as PCA, which does not consider class labels, and LDA, which uses only simple low-order dependencies. As evidenced by several simulations on high-dimensional data sets, th...


Optimum Spatio-Spectral Filtering Network for Brain-Computer Interface

This paper proposes a feature extraction method for motor imagery brain-computer interface (BCI) using electroencephalography. We consider the primary neurophysiologic phenomenon of motor imagery, termed event-related desynchronization, and formulate the learning task for feature extraction as maximizing the mutual information between the spatio-spectral filtering parameters and the class labels....



Journal:
  • Pattern Recognition

Volume 39, Issue 

Pages -

Publication date: 2006